Generalized Low Rank Models

Authors

  • Madeleine Udell
  • Corinne Horn
  • Reza Bosagh Zadeh
  • Stephen P. Boyd
Abstract

Principal components analysis (PCA) is a well-known technique for approximating a data set represented by a matrix by a low rank matrix. Here, we extend the idea of PCA to handle arbitrary data sets consisting of numerical, Boolean, categorical, ordinal, and other data types. This framework encompasses many well known techniques in data analysis, such as nonnegative matrix factorization, matrix completion, sparse and robust PCA, k-means, and maximum margin matrix factorization. The method handles heterogeneous data sets, and leads to coherent schemes for compressing, denoising, and imputing missing entries across all data types simultaneously. It also admits a number of interesting interpretations of the low rank factors, which allow clustering of examples or of features. We propose a large scale, parallel algorithm for fitting generalized low rank models, and describe implementations and numerical results.
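For concreteness, the sketch below illustrates the simplest instance of this framework: quadratic loss with missing entries (PCA on incomplete data), fitted by alternating regularized least squares. The function name fit_glrm_quadratic, the mask convention, and the ridge penalty reg are illustrative assumptions for this sketch, not the paper's implementation or its large-scale parallel algorithm.

```python
import numpy as np


def fit_glrm_quadratic(A, mask, k, reg=0.1, n_iters=50, seed=0):
    """Approximate A as X @ Y using only entries where mask == 1,
    by alternating ridge-regularized least squares updates."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    X = rng.normal(size=(m, k))
    Y = rng.normal(size=(k, n))
    for _ in range(n_iters):
        # Update each row of X with Y held fixed: one small k x k solve per row.
        for i in range(m):
            obs = mask[i, :] > 0
            Yo = Y[:, obs]                         # k x (#observed in row i)
            G = Yo @ Yo.T + reg * np.eye(k)
            X[i, :] = np.linalg.solve(G, Yo @ A[i, obs])
        # Update each column of Y with X held fixed, symmetrically.
        for j in range(n):
            obs = mask[:, j] > 0
            Xo = X[obs, :]                         # (#observed in col j) x k
            G = Xo.T @ Xo + reg * np.eye(k)
            Y[:, j] = np.linalg.solve(G, Xo.T @ A[obs, j])
    return X, Y


# Usage: impute the missing entries of a partially observed rank-2 matrix.
rng = np.random.default_rng(1)
A_true = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 30))
mask = (rng.random((50, 30)) < 0.7).astype(float)  # roughly 70% observed
X, Y = fit_glrm_quadratic(A_true * mask, mask, k=2)
rmse = np.sqrt(np.mean((X @ Y - A_true)[mask == 0] ** 2))
print("imputation RMSE on held-out entries:", rmse)
```

Each row and column update is an independent small least squares problem, which is what makes alternating schemes of this kind straightforward to parallelize; other losses and regularizers in the generalized framework replace these solves with small convex subproblems.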


Related works

Greedy Learning of Generalized Low-Rank Models

Learning of low-rank matrices is fundamental to many machine learning applications. A state-of-the-art algorithm is the rank-one matrix pursuit (R1MP). However, it can only be used in matrix completion problems with the square loss. In this paper, we develop a more flexible greedy algorithm for generalized low-rank models whose optimization objective can be smooth or nonsmooth, general convex or...


Learning of Generalized Low-Rank Models: A Greedy Approach

Learning of low-rank matrices is fundamental to many machine learning applications. A state-of-the-art algorithm is the rank-one matrix pursuit (R1MP). However, it can only be used in matrix completion problems with the square loss. In this paper, we develop a more flexible greedy algorithm for generalized low-rank models whose optimization objective can be smooth or nonsmooth, general convex o...


Low-rank scale-invariant tensor product smooths for generalized additive mixed models.

A general method for constructing low-rank tensor product smooths for use as components of generalized additive models or generalized additive mixed models is presented. A penalized regression approach is adopted in which tensor product smooths of several variables are constructed from smooths of each variable separately, these "marginal" smooths being represented using a low-rank basis with an...


Spatial Design for Knot Selection in Knot-Based Low-Rank Models

Analysis of large geostatistical data sets usually entails expensive matrix computations. This problem creates challenges in implementing statistical inference with traditional Bayesian models. In addition, researchers often face multiple spatial data sets with complex spatial dependence structures whose analysis is difficult. This is a problem for MCMC sampling algorith...


Nonlinearly Structured Low-Rank Approximation

Polynomially structured low-rank approximation problems occur in algebraic curve fitting (e.g., conic section fitting), subspace clustering (generalized principal component analysis), and nonlinear and parameter-varying system identification. The maximum likelihood estimation principle applied to these nonlinear models leads to nonconvex optimization problems and yields inconsistent estima...


Low-rank Iterative Methods for Projected Generalized Lyapunov Equations

We generalize an alternating direction implicit method and the Smith method for large-scale projected generalized Lyapunov equations. Such equations arise in model reduction of descriptor systems. Low-rank versions of these methods are also presented, which can be used to compute low-rank approximat...



Journal:
  • Foundations and Trends in Machine Learning

Volume 9, Issue

Pages  -

Publication date: 2016